On Kullback-Leibler Loss and Density Estimation

Authors

Abstract


Similar Resources

Kullback-Leibler approximation of spectral density functions

We introduce a Kullback-Leibler type distance between spectral density functions of stationary stochastic processes and solve the problem of optimal approximation of a given spectral density Ψ by one that is consistent with prescribed second-order statistics. In general, such statistics are expressed as the state covariance of a linear filter driven by a stochastic process whose spectral densit...
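
A rough numerical sketch, not taken from the paper itself: a Kullback-Leibler type distance between two spectral densities can be evaluated on a frequency grid. The unnormalised form d(Ψ, Φ) = (1/2π) ∫ Ψ log(Ψ/Φ) dθ is assumed here, and the AR(1)-style densities are purely illustrative.

```python
import numpy as np

def kl_spectral(psi, phi, n_grid=4096):
    """Approximate d(Psi, Phi) = (1/(2*pi)) * int_{-pi}^{pi} Psi * log(Psi/Phi) dtheta
    by a Riemann sum on a uniform grid; psi and phi must be strictly positive."""
    theta = np.linspace(-np.pi, np.pi, n_grid, endpoint=False)
    p, q = psi(theta), phi(theta)
    return np.mean(p * np.log(p / q))   # grid mean ≈ (1/(2*pi)) * integral

# Hypothetical AR(1)-type spectral densities with poles at 0.8 and 0.5.
psi = lambda th: 1.0 / np.abs(1.0 - 0.8 * np.exp(1j * th)) ** 2
phi = lambda th: 1.0 / np.abs(1.0 - 0.5 * np.exp(1j * th)) ** 2

print(kl_spectral(psi, phi))
```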


Improved Minimax Prediction Under Kullback-Leibler Loss

Let X | μ ∼ N_p(μ, v_x I) and Y | μ ∼ N_p(μ, v_y I) be independent p-dimensional multivariate normal vectors with common unknown mean μ, and let p(x | μ) and p(y | μ) denote the conditional densities of X and Y. Based on only observing X = x, we consider the problem of obtaining a predictive distribution p̂(y | x) for Y that is close to p(y | μ) as measured by Kullback-Leibler loss. The natural straw man ...
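
A minimal Monte Carlo sketch of this setup, assuming only the obvious plug-in ("straw man") predictive p̂(y | x) = N_p(x, v_y I) rather than the paper's improved procedure: its Kullback-Leibler loss reduces to ||x − μ||² / (2 v_y), so its risk is p·v_x / (2 v_y).

```python
import numpy as np

rng = np.random.default_rng(0)
p, vx, vy = 5, 1.0, 2.0          # dimension and variances (arbitrary choices)
mu = rng.normal(size=p)          # unknown common mean

def kl_gauss_same_cov(m1, m2, v):
    """KL( N(m1, v*I) || N(m2, v*I) ) for a common isotropic covariance."""
    return np.sum((m1 - m2) ** 2) / (2.0 * v)

# Plug-in predictive phat(y|x) = N_p(x, vy*I); average its KL loss over draws of X.
losses = [kl_gauss_same_cov(mu, mu + np.sqrt(vx) * rng.normal(size=p), vy)
          for _ in range(20_000)]
print(np.mean(losses), "≈", p * vx / (2.0 * vy))   # Monte Carlo risk vs. p*vx/(2*vy)
```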


Entropy and Kullback-Leibler divergence estimation based on Szegö's theorem

In this work, a new technique for the estimation of Shannon's entropy and the Kullback-Leibler (KL) divergence for one-dimensional data is presented. The estimator is based on Szegö's theorem for sequences of Toeplitz matrices, which deals with the asymptotic behavior of the eigenvalues of those matrices, and the analogy between a probability density function (PDF) and a power spectral ...
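
A rough sketch of the Szegö mechanism such an estimator rests on, not the paper's data-driven procedure: treating a known density p on [−π, π) as a Toeplitz symbol f = 2πp and averaging x log x over the eigenvalues of T_n(f) recovers Shannon's entropy via H(p) ≈ log(2π) − (1/n) Σ λ_k log λ_k. The grid sizes and test density below are arbitrary choices.

```python
import numpy as np
from scipy.linalg import toeplitz

n = 200
theta = np.linspace(-np.pi, np.pi, 8192, endpoint=False)
p = np.exp(-0.5 * theta ** 2)
p /= p.sum() * (theta[1] - theta[0])           # normalise to a density on [-pi, pi)

f = 2.0 * np.pi * p                            # Toeplitz symbol
coeffs = np.array([np.mean(f * np.exp(-1j * k * theta)) for k in range(n)])
T = toeplitz(coeffs)                           # Hermitian Toeplitz matrix T_n(f)
lam = np.linalg.eigvalsh(T)                    # real, positive eigenvalues

H_szego = np.log(2.0 * np.pi) - np.mean(lam * np.log(lam))
H_direct = -np.sum(p * np.log(p)) * (theta[1] - theta[0])
print(H_szego, "≈", H_direct)                  # the two should roughly agree
```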


Grade Estimation of Kullback-Leibler Information Number

An estimator of the Kullback-Leibler information number, based on its representation as a functional of the grade density, is introduced. Its strong consistency is proved under mild conditions on the grade density. The same approach is used to study the entropy measure of bivariate dependence (mutual information). Some applications to detection theory are also given.
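
A crude sketch of the grade representation assumed here: with X ∼ P and G the distribution function of Q, the grades U = G(X) have a density r on (0, 1) and D(P || Q) = ∫₀¹ r(u) log r(u) du. The histogram plug-in below is only illustrative, not the estimator studied in the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# P = N(1, 1), Q = N(0, 1): the exact KL information number is 0.5.
x = rng.normal(loc=1.0, scale=1.0, size=200_000)   # sample from P
u = norm.cdf(x)                                    # grades under Q

counts, edges = np.histogram(u, bins=100, range=(0.0, 1.0), density=True)
width = edges[1] - edges[0]
r = counts[counts > 0]                             # empty bins contribute nothing
kl_hat = np.sum(r * np.log(r)) * width             # plug-in for int r*log(r) du
print(kl_hat, "≈ 0.5")
```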


Notes on Kullback-Leibler Divergence and Likelihood

The Kullback-Leibler (KL) divergence is a fundamental equation of information theory that quantifies the proximity of two probability distributions. Although difficult to understand by examining the equation, an intuition and understanding of the KL divergence arises from its intimate relationship with likelihood theory. We discuss how KL divergence arises from likelihood theory in an attempt t...
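
A small sketch of that relationship on a toy Gaussian location model (assumed here for illustration): D(p || q_θ) = −H(p) − E_p[log q_θ], so minimising KL divergence over θ and maximising the expected log-likelihood pick out the same parameter, and the sample-average log-likelihood approximates the latter.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=50_000)    # data from p = N(2, 1)

thetas = np.linspace(0.0, 4.0, 401)                # model q_theta = N(theta, 1)
avg_loglik = np.array([np.mean(-0.5 * np.log(2 * np.pi) - 0.5 * (x - t) ** 2)
                       for t in thetas])
kl = 0.5 * (thetas - 2.0) ** 2                     # closed form D( N(2,1) || N(theta,1) )

print("argmax average log-likelihood:", thetas[np.argmax(avg_loglik)])  # ~ 2 (sample mean)
print("argmin KL divergence:        ", thetas[np.argmin(kl)])           # exactly 2.0
```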


Journal

Journal title: The Annals of Statistics

Year: 1987

ISSN: 0090-5364

DOI: 10.1214/aos/1176350606